Tilted Variational Bayes

Authors

  • James Hensman
  • Max Zwiessele
  • Neil D. Lawrence
Abstract

We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at github.com/SheffieldML/TVB.
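For orientation, the quantity at stake can be written with the standard variational decomposition of the log marginal likelihood; this is the textbook identity underlying VB, given here for context rather than the tilted bound constructed in the paper (f denotes the latent function and q(f) the approximate posterior):

\[
\log p(\mathbf{y})
  \;=\;
  \underbrace{\mathbb{E}_{q(\mathbf{f})}\!\left[\log \frac{p(\mathbf{y},\mathbf{f})}{q(\mathbf{f})}\right]}_{\text{lower bound } \mathcal{L}(q)}
  \;+\;
  \underbrace{\operatorname{KL}\!\left[\,q(\mathbf{f})\;\|\;p(\mathbf{f}\mid\mathbf{y})\,\right]}_{\ge 0}.
\]

Maximizing \(\mathcal{L}(q)\) over q is equivalent to minimizing the KL term on the right; the abstract's point is that for Gaussian process classification this KL divergence can be infinite, which is where traditional VB breaks down and the tilted bound is introduced instead.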


Similar articles

An Alternative View of Variational Bayes and Minimum Variational Stochastic Complexity

Bayesian learning is widely used in many applied data-modelling problems and is often accompanied by approximation schemes, since it requires intractable computation of the posterior distributions. In this study, we focus on two approximation methods: variational Bayes and the local variational approximation. We show that the variational Bayes approach for statistical models with latent...


On Variational Bayes Algorithms for Exponential Family Mixtures

In this paper, we empirically analyze the behavior of the Variational Bayes algorithm for the mixture model. While Variational Bayesian learning has provided computational tractability and good generalization performance in many applications, little has been done to investigate its properties. Recently, the stochastic complexity of mixture models in Variational Bayesian learning was cl...


Variational Bayes Estimation of Mixing Coefficients

We investigate theoretically some properties of variational Bayes approximations based on estimating the mixing coefficients of known densities. We show that, with probability 1 as the sample size n grows large, the iterative algorithm for the variational Bayes approximation converges locally to the maximum likelihood estimator at the rate of O(1/n). Moreover, the variational posterior distribu...
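For concreteness, a minimal sketch of the model class described above, with illustrative notation (\(\boldsymbol{\alpha}\) for the mixing coefficients, \(f_k\) for the fixed component densities) that is not taken from the paper:

\[
p(x \mid \boldsymbol{\alpha}) \;=\; \sum_{k=1}^{K} \alpha_k\, f_k(x),
\qquad \alpha_k \ge 0, \quad \sum_{k=1}^{K} \alpha_k = 1,
\]

where each component density \(f_k\) is known and only the mixing coefficients \(\boldsymbol{\alpha}\) are inferred; the O(1/n) statement above concerns how the iterative variational Bayes approximation to the posterior over \(\boldsymbol{\alpha}\) approaches the maximum likelihood estimator as the sample size n grows.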


Supplementary Material for Adversarial Variational Bayes: Unifying Variational Autoencoders and Generative Adversarial Networks

In the main text we derived Adversarial Variational Bayes (AVB) and demonstrated its usefulness both for black-box Variational Inference and for learning latent variable models. This document contains proofs that were omitted in the main text as well as some further details about the experiments and additional results.


Semiparametric Mean Field Variational Bayes: General Principles and Numerical Issues

We introduce the term semiparametric mean field variational Bayes to describe the relaxation of mean field variational Bayes in which some density functions in the product density restriction are pre-specified to be members of convenient parametric families. This notion has appeared in various guises in the mean field variational Bayes literature during its history and we endeavor to unify this...
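A minimal sketch of the product density restriction being described, with an illustrative Gaussian choice for a pre-specified factor (the Gaussian form is an example, not a claim about the paper):

\[
q(\boldsymbol{\theta}) \;=\; \prod_{i=1}^{M} q_i(\boldsymbol{\theta}_i),
\qquad \text{with, e.g.,}\quad q_j(\boldsymbol{\theta}_j) = \mathcal{N}\!\left(\boldsymbol{\theta}_j;\, \boldsymbol{\mu}_j, \boldsymbol{\Sigma}_j\right) \ \text{for selected factors } j,
\]

so that the chosen factors are constrained to a convenient parametric family (here Gaussian, with \(\boldsymbol{\mu}_j\) and \(\boldsymbol{\Sigma}_j\) as the variational parameters) while the remaining factors are left in free form, as in ordinary mean field variational Bayes.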





Publication date: 2014